Wilsonville
Robotic Arm Platform for Multi-View Image Acquisition and 3D Reconstruction in Minimally Invasive Surgery
Saikia, Alexander, Di Vece, Chiara, Bonilla, Sierra, He, Chloe, Magbagbeola, Morenike, Mennillo, Laurent, Czempiel, Tobias, Bano, Sophia, Stoyanov, Danail
Minimally invasive surgery (MIS) offers significant benefits such as reduced recovery time and minimised patient trauma, but poses challenges in visibility and access, making accurate 3D reconstruction an important tool for surgical planning and navigation. This work introduces a robotic arm platform for efficient multi-view image acquisition and precise 3D reconstruction in MIS settings. We adapted a laparoscope to a robotic arm and captured ex-vivo images of several ovine organs across varying lighting conditions (operating room and laparoscopic) and trajectories (spherical and laparoscopic). We employed recently released learning-based feature matchers combined with COLMAP to produce our reconstructions, which were evaluated quantitatively against high-precision laser scans. Our results show that whilst reconstructions suffer most under realistic MIS lighting and trajectory, many versions of our pipeline achieve close to sub-millimetre accuracy, with an average of 1.05 mm Root Mean Squared Error and 0.82 mm Chamfer distance. Our best reconstruction results occur with operating room lighting and spherical trajectories. Our robotic platform provides a tool for controlled, repeatable multi-view data acquisition for 3D generation in MIS environments, which we hope leads to new datasets for training learning-based models.
- Asia > Japan > Honshū > Kantō > Tokyo Metropolis Prefecture > Tokyo (0.14)
- Europe > United Kingdom > England > Greater London > London (0.04)
- Europe > Germany (0.04)
- North America > United States > Oregon > Clackamas County > Wilsonville (0.04)
- Health & Medicine > Surgery (1.00)
- Health & Medicine > Diagnostic Medicine > Imaging (0.68)
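The abstract above reports reconstruction quality as RMSE and Chamfer distance against laser-scan ground truth. As a minimal sketch (not the authors' evaluation code), the symmetric Chamfer distance between two point clouds can be computed as the mean nearest-neighbour distance in each direction, summed:

```python
import numpy as np

def chamfer_distance(a, b):
    """Symmetric Chamfer distance between point clouds a (N, 3) and b (M, 3):
    mean nearest-neighbour distance from a to b, plus the same from b to a."""
    # Full pairwise distance matrix; fine for small clouds,
    # use a KD-tree (e.g. scipy.spatial.cKDTree) for large ones.
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return d.min(axis=1).mean() + d.min(axis=0).mean()

cloud = np.random.rand(500, 3)
print(chamfer_distance(cloud, cloud))  # a cloud has zero distance to itself: 0.0
```

Conventions vary (some definitions average rather than sum the two directions, or use squared distances), so the paper's 0.82 mm figure may follow a different normalisation.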
Multi-Sensor and Multi-temporal High-Throughput Phenotyping for Monitoring and Early Detection of Water-Limiting Stress in Soybean
Jones, Sarah E., Ayanlade, Timilehin, Fallen, Benjamin, Jubery, Talukder Z., Singh, Arti, Ganapathysubramanian, Baskar, Sarkar, Soumik, Singh, Asheesh K.
Soybean production is susceptible to biotic and abiotic stresses, exacerbated by extreme weather events. Water-limiting stress, i.e. drought, emerges as a significant risk for soybean production, underscoring the need for advancements in stress monitoring for crop breeding and production. This project combines multi-modal information to identify the most effective and efficient automated methods to investigate drought response. We investigated a set of diverse soybean accessions using multiple sensors in a time-series, high-throughput phenotyping manner to: (1) develop a pipeline for rapid classification of soybean drought stress symptoms, and (2) investigate methods for early detection of drought stress. We utilized high-throughput time-series phenotyping using UAVs and sensors in conjunction with machine learning (ML) analytics, which offered a swift and efficient means of phenotyping. The red-edge and green bands were most effective for classifying canopy wilting stress. The Red-Edge Chlorophyll Vegetation Index (RECI) successfully differentiated susceptible and tolerant soybean accessions prior to visual symptom development. We report pre-visual detection of soybean wilting using a combination of different vegetation indices. These results can contribute to early stress detection methodologies and rapid classification of drought responses in screening nurseries for breeding and production applications.
- North America > United States > Iowa > Story County > Ames (0.04)
- North America > Puerto Rico > Peñuelas > Peñuelas (0.04)
- North America > United States > Oregon > Clackamas County > Wilsonville (0.04)
- (7 more...)
- Research Report > New Finding (1.00)
- Research Report > Experimental Study (1.00)
- Health & Medicine (1.00)
- Food & Agriculture > Agriculture (1.00)
- Government > Regional Government > North America Government > United States Government (0.93)
- Education (0.93)
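The soybean study leans on the Red-Edge Chlorophyll Index for pre-visual stress detection. A minimal sketch of the commonly used formulation, RECI = NIR / RedEdge − 1; the paper's exact band arithmetic is not given here, and the `eps` guard is an addition for illustration:

```python
import numpy as np

def reci(nir, red_edge, eps=1e-6):
    """Red-Edge Chlorophyll Index: reflectance ratio of the NIR band
    to the red-edge band, minus one. eps guards against division by zero."""
    return nir / (red_edge + eps) - 1.0

# Per-pixel index from two reflectance bands (toy values)
nir = np.array([0.60, 0.55])
red_edge = np.array([0.20, 0.35])
print(reci(nir, red_edge))
```

Higher values indicate more chlorophyll; a drop in RECI relative to tolerant accessions is the kind of signal the study reports before visible wilting.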
High-Precision Fruit Localization Using Active Laser-Camera Scanning: Robust Laser Line Extraction for 2D-3D Transformation
Chu, Pengyu, Li, Zhaojian, Zhang, Kaixiang, Lammers, Kyle, Lu, Renfu
Recent advancements in deep learning-based approaches have led to remarkable progress in fruit detection, enabling robust fruit identification in complex environments. However, much less progress has been made on fruit 3D localization, which is equally crucial for robotic harvesting. Complex fruit shape/orientation, fruit clustering, varying lighting conditions, and occlusions by leaves and branches have greatly restricted existing sensors from achieving accurate fruit localization in the natural orchard environment. In this paper, we report on the design of a novel localization technique, called Active Laser-Camera Scanning (ALACS), to achieve accurate and robust fruit 3D localization. The ALACS hardware setup comprises a red line laser, an RGB color camera, a linear motion slide, and an external RGB-D camera. Leveraging the principles of dynamic-targeting laser-triangulation, ALACS enables precise transformation of the projected 2D laser line from the surface of apples to the 3D positions. To facilitate laser pattern acquisitions, a Laser Line Extraction (LLE) method is proposed for robust and high-precision feature extraction on apples. Comprehensive evaluations of LLE demonstrated its ability to extract precise patterns under variable lighting and occlusion conditions. The ALACS system achieved average apple localization accuracies of 6.9 to 11.2 mm at distances ranging from 1.0 m to 1.6 m, compared to 21.5 mm by a commercial RealSense RGB-D camera, in an indoor experiment. Orchard evaluations demonstrated that ALACS has achieved a 95% fruit detachment rate versus a 71% rate by the RealSense camera. By overcoming the challenges of apple 3D localization, this research contributes to the advancement of robotic fruit harvesting technology.
- North America > United States > Michigan > Ingham County > Lansing (0.14)
- North America > United States > Michigan > Ingham County > East Lansing (0.14)
- North America > United States > Michigan > Ingham County > Holt (0.14)
- (4 more...)
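The ALACS abstract above rests on laser triangulation: a known baseline between the laser and the camera turns the laser line's pixel position into depth. A simplified 2D sketch of the geometry (the actual system uses a calibrated extrinsic model and a moving laser on a linear slide, so the parameters here are illustrative):

```python
import math

def triangulate_depth(x_pix, f, baseline, laser_angle):
    """Depth of a laser-lit point under a simplified 2D triangulation model.

    Camera at the origin looking along +z with focal length f (pixels);
    laser source offset `baseline` metres along x, its beam tilted
    `laser_angle` radians toward the optical axis. Intersecting the
    camera ray through pixel column x_pix with the laser ray gives
        z = f * baseline / (x_pix + f * tan(laser_angle))
    """
    return f * baseline / (x_pix + f * math.tan(laser_angle))

# 10 cm baseline, 1000 px focal length, laser parallel to the optical axis:
print(triangulate_depth(100.0, 1000.0, 0.1, 0.0))  # 1.0 (metres)
```

The dependence on x_pix is why sub-pixel laser line extraction (the LLE step) directly controls depth accuracy.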
Active Laser-Camera Scanning for High-Precision Fruit Localization in Robotic Harvesting: System Design and Calibration
Zhang, Kaixiang, Chu, Pengyu, Lammers, Kyle, Li, Zhaojian, Lu, Renfu
Robust and effective fruit detection and localization is essential for robotic harvesting systems. While extensive research efforts have been devoted to improving fruit detection, less emphasis has been placed on the fruit localization aspect, which is a crucial yet challenging task due to limited depth accuracy from existing sensor measurements in the natural orchard environment with variable lighting conditions and foliage/branch occlusions. In this paper, we present the system design and calibration of an Active LAser-Camera Scanner (ALACS), a novel perception module for robust and high-precision fruit localization. The hardware of ALACS mainly consists of a red line laser, an RGB camera, and a linear motion slide, which are seamlessly integrated into an active scanning scheme where a dynamic-targeting laser-triangulation principle is employed. A high-fidelity extrinsic model is developed to pair the laser illumination and the RGB camera, enabling precise depth computation when the target is captured by both sensors. A random sample consensus-based robust calibration scheme is then designed to calibrate the model parameters based on collected data. Comprehensive evaluations are conducted to validate the system model and calibration scheme. The results show that the proposed calibration method can detect and remove data outliers to achieve robust parameter computation, and the calibrated ALACS system is able to achieve high-precision localization with millimeter-level accuracy.
- North America > United States > Michigan > Ingham County > Lansing (0.04)
- North America > United States > Michigan > Ingham County > East Lansing (0.04)
- North America > United States > California > Santa Clara County > Santa Clara (0.04)
- (4 more...)
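The calibration paper above relies on random sample consensus to detect and remove data outliers before estimating model parameters. A minimal RANSAC sketch on a 2D line-fitting problem (the real scheme calibrates a laser-camera extrinsic model, not a line; names and tolerances here are illustrative):

```python
import random

def ransac_line(points, iters=200, tol=0.05, seed=0):
    """Fit y = m*x + c by RANSAC: sample point pairs, count inliers
    within `tol`, and keep the hypothesis with the largest consensus set."""
    rng = random.Random(seed)
    best, best_inliers = (0.0, 0.0), -1
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical pair, skip
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        inliers = sum(abs(y - (m * x + c)) < tol for x, y in points)
        if inliers > best_inliers:
            best_inliers, best = inliers, (m, c)
    return best

pts = [(i * 0.1, 2.0 * (i * 0.1) + 1.0) for i in range(20)]  # inliers on y = 2x + 1
pts.append((0.5, 9.0))                                       # one gross outlier
m, c = ransac_line(pts)
print(round(m, 3), round(c, 3))  # 2.0 1.0 -- the outlier is voted out
```

A least-squares fit over all points would be dragged toward the outlier; the consensus step is what makes the calibration robust.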
Multi-growth stage plant recognition: a case study of Palmer amaranth (Amaranthus palmeri) in cotton (Gossypium hirsutum)
Coleman, Guy RY, Kutugata, Matthew, Walsh, Michael J, Bagavathiannan, Muthukumar
Many advanced, image-based precision agricultural technologies for plant breeding, field crop research, and site-specific crop management hinge on the reliable detection and phenotyping of plants across highly variable morphological growth stages. Convolutional neural networks (CNNs) have shown promise for image-based plant phenotyping and weed recognition, but their ability to recognize growth stages, often with stark differences in appearance, is uncertain. Amaranthus palmeri (Palmer amaranth) is a particularly challenging weed plant in cotton (Gossypium hirsutum) production, exhibiting highly variable plant morphology both across growth stages over a growing season, as well as between plants at a given growth stage due to high genetic diversity. In this paper, we investigate eight-class growth stage recognition of A. palmeri in cotton as a challenging model for You Only Look Once (YOLO) architectures. We compare 26 different architecture variants from YOLO v3, v5, v6, v6 3.0, v7, and v8 on an eight-class growth stage dataset of A. palmeri. The highest mAP@[0.5:0.95] for recognition of all growth stage classes was 47.34%, achieved by v8-X, with inter-class confusion across visually similar growth stages. With all growth stages grouped as a single class, performance increased, with a maximum mean average precision (mAP@[0.5:0.95]) of 67.05%, achieved by v7-Original. Single class recall of up to 81.42% was achieved by v5-X, and precision of up to 89.72% was achieved by v8-X. Class activation maps (CAM) were used to understand model attention on the complex dataset. Fewer classes, grouped by visual or size features, improved performance over the ground-truth eight-class dataset. Successful growth stage detection highlights the substantial opportunity for improving plant phenotyping and weed recognition technologies with open-source object detection architectures.
- Oceania > Australia (0.04)
- North America > United States > Texas > Harris County > Houston (0.04)
- North America > United States > Texas > Burleson County (0.04)
- (7 more...)
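The detection metrics in the abstract above (mAP@[0.5:0.95], precision, recall) all hinge on intersection-over-union to decide whether a predicted box matches a ground-truth box; mAP@[0.5:0.95] averages AP over IoU thresholds from 0.5 to 0.95 in steps of 0.05. A minimal IoU sketch for axis-aligned boxes:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; clamp to zero when the boxes are disjoint
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7, about 0.1429
```

At the strict end of the 0.5:0.95 sweep even small localization errors fail the match, which is part of why the eight-class mAP sits well below single-class recall.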
These drones see in the dark
SAN FRANCISCO – The world's largest drone maker has teamed up with the nation's largest thermal camera company to create ready-to-fly drones that can see in the dark. The drone maker is DJI, a China-based company that currently has about 70% of the world drone market. The camera is by FLIR Systems, a Wilsonville, Ore.-based thermal and infrared imaging company. The collaboration will produce drones that can be used in search-and-rescue, firefighting, security and surveillance. At a news conference Thursday, the companies showed video shot from one of the infrared-capable drones in which several people walking in a pitch black field at night looked like brightly lit light bulbs moving across the rough ground.
- North America > United States > California > San Francisco County > San Francisco (0.28)
- North America > United States > Oregon > Clackamas County > Wilsonville (0.26)
- North America > Canada > Alberta (0.26)
- (2 more...)
FLIR Systems Announces Industry-First Deep Learning-Enabled Camera Family
WILSONVILLE, Ore.--(BUSINESS WIRE)--FLIR Systems, Inc. (NASDAQ: FLIR) today announced the FLIR Firefly camera family, the industry's first deep learning inference-enabled machine vision camera. The FLIR Firefly, which integrates the Intel Movidius Myriad 2 Vision Processing Unit (VPU), is designed for image analysis professionals using deep learning for more accurate decisions, and faster, easier system development. Traditional rules-based software is ideal for straightforward tasks such as barcode reading or checking a manufactured part against specifications. The FLIR Firefly combines a new, affordable machine vision platform with the power of deep learning to address complex and subjective problems such as recognizing faces or classifying the quality of a solar panel. The FLIR Firefly leverages the Intel Movidius Myriad 2 VPU's advanced capabilities in a compact and low-power camera, ideal for embedded and handheld systems.
- North America > United States > Oregon > Clackamas County > Wilsonville (0.38)
- Europe > Germany > Baden-Württemberg > Stuttgart Region > Stuttgart (0.06)
Explosion in Artificial Intelligence Coming for Home Care and Hospitals
The use of artificial intelligence (AI) technology in health care is poised to soar throughout the globe in the coming years, including to support preventive care in people's homes. The report defines artificial intelligence (AI) as providing "a device or software program the ability to interpret complex data, including images, video, text, and speech or other sounds, and act on that interpretation to achieve a goal." The number of monitoring devices capturing patient data for AI purposes such as predictive analytics will increase exponentially, the report found. Specifically, there were 53,000 such devices in use as of 2017, a number projected to reach 3.1 million by 2021. "This includes the use of AI for home-based preventive care solutions," an ABI press release stated.
These drones see in the dark
Workhorse Group Inc. of Loveland, Ohio, received permission Wednesday from the Federal Aviation Administration to begin testing a delivery drone nicknamed HorseFly that is launched from atop the company's electric trucks. SAN FRANCISCO – The world's largest drone maker has teamed up with the nation's largest thermal camera company to create ready-to-fly drones that can see in the dark. The drone maker is DJI, a China-based company that currently has about 70% of the world drone market. The camera is by FLIR Systems, a Wilsonville, Ore.-based thermal and infrared imaging company. The collaboration will produce drones that can be used in search-and-rescue, firefighting, security and surveillance.
- North America > United States > California > San Francisco County > San Francisco (0.28)
- North America > United States > Oregon > Clackamas County > Wilsonville (0.26)
- North America > United States > Ohio (0.26)
- (2 more...)
- Transportation > Air (1.00)
- Transportation > Ground > Road (0.79)
- Transportation > Electric Vehicle (0.79)